Write Test Plans That Catch Bugs Before Users Do — QA Engineering

By the end of this page, you will understand how QA Engineers write and execute test plans using pytest and vitest — and how AI agents can generate comprehensive test suites automatically.

Testing & Verification — The 2-Minute Overview

Chapter 11 Cartoon — Ship Day Surprise

Think about the last time you bought a car. You didn't see the thousands of crash tests, brake tests, engine stress tests, and safety inspections behind that vehicle. You just turned the key and drove. But somebody had to verify that every part works, every edge case is handled, and every safety standard is met — before it reached the showroom. That verification is QA Engineering. The diagram below maps the whole process, zoomed out.

```mermaid
graph LR
  subgraph INPUT["Testing Inputs"]
    I1["Code from Development"]
    I2["Test Plan from Senior Dev"]
    I3["Acceptance Criteria from PRD"]
  end
  subgraph QA["QA Engineering"]
    Q1["Test Strategy — What to test, at what level"]
    Q2["Test Execution — pytest / vitest"]
    Q3["Defect Reporting — What failed and why"]
  end
  subgraph OUTPUT["Testing Outputs"]
    O1["Test Reports with Coverage"]
    O2["Defect Log"]
    O3["Definition of Done ✅"]
  end
  I1 --> Q1
  I2 --> Q1
  I3 --> Q3
  Q1 --> Q2
  Q2 --> Q3
  Q3 --> O1
  Q3 --> O2
  Q3 --> O3
  style INPUT fill:#16213e,stroke:#0f3460,color:#fff
  style QA fill:#1a1a2e,stroke:#e94560,color:#fff
  style OUTPUT fill:#006400,stroke:#00cc00,color:#fff
```

You Already Know QA — You Just Don't Know It Yet

You've been doing QA every time you proofread an important email before hitting send. Let's prove it.

Imagine you're sending a job application email to your dream company:


✉️ The Email Proofreading Analogy

Step 1 — You check structure: Subject line present? Greeting correct? Attachment included?

🔗 QA Layer: ① UNIT TESTING — Test individual components in isolation. Does each function return the correct output?

Step 2 — You check flow: Does paragraph 1 connect to paragraph 2? Does the closing match the tone?

🔗 QA Layer: ② INTEGRATION TESTING — Test component interactions. Do modules work together correctly?

Step 3 — You check edge cases: What if they open it on mobile? What if the attachment is too large? What if the link is broken?

🔗 QA Layer: ③ EDGE-CASE TESTING — Test boundaries, errors, and unusual scenarios.

Step 4 — You read it as the recipient: Does it make sense to someone who doesn't know your context?

🔗 QA Layer: ④ ACCEPTANCE TESTING — Does the product meet the user's acceptance criteria?

The Complete Mapping

| Email Proofreading | QA Engineering | Level |
|---|---|---|
| Check subject, greeting, attachment | Test each function individually | ① Unit Tests |
| Check paragraph flow and coherence | Test module interactions | ② Integration Tests |
| Check mobile rendering, link validity | Test edge cases and boundaries | ③ Edge-Case Tests |
| Read as the recipient | Validate against acceptance criteria | ④ Acceptance Tests |

You just learned QA without writing a single test.


The 5 Pillars of QA Engineering

1. Test Strategy

Not all tests are equal. The strategy defines what to test, at what level, with what priority.

The testing pyramid: many unit tests (fast, cheap), fewer integration tests (moderate), few e2e tests (slow, expensive). The QA Engineer maps each acceptance criterion to a test level and sets coverage targets.

| Level | What It Tests | Speed | Cost | Quantity |
|---|---|---|---|---|
| Unit | Individual functions | Fast (ms) | Low | Many (hundreds) |
| Integration | Module interactions, API calls | Medium (seconds) | Medium | Moderate (dozens) |
| E2E | Full user journeys | Slow (minutes) | High | Few (a handful) |

2. pytest (Backend Testing)

pytest is the Python testing standard — simple, powerful, and extensible.

Fixtures for setup/teardown. Parametrize for testing multiple inputs. Markers for categorizing tests. Coverage reports for identifying gaps.

| Concept | What It Means | When to Use |
|---|---|---|
| Fixtures | Reusable setup/teardown for tests | Database connections, mock data |
| Parametrize | Run the same test with different inputs | Validating multiple scenarios |
| Markers | Tag tests (slow, integration, smoke) | Selective test execution |
| Coverage | Measure % of code executed by tests | Gap identification |
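The concepts above fit in a few lines of pytest. Everything here is illustrative: `validate_email` is a hypothetical stand-in for whatever function you are actually testing, not a real library API.

```python
import re

import pytest


# Hypothetical function under test; stands in for your real validator.
def validate_email(address: str) -> bool:
    return bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", address))


@pytest.fixture
def known_good_address():
    # Fixture: reusable setup injected into any test that names it.
    return "user@example.com"


def test_accepts_known_good_address(known_good_address):
    assert validate_email(known_good_address)


# Parametrize: one test body, many inputs, each reported separately.
@pytest.mark.parametrize(
    "bad_input",
    ["", "no-at-sign", "two@@example.com", "spaces in@example.com"],
)
def test_rejects_malformed_addresses(bad_input):
    assert not validate_email(bad_input)
```

Run with `pytest -v` to see each parametrized case reported individually; with the pytest-cov plugin installed, `pytest --cov` produces the coverage report described above.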

3. vitest (Frontend Testing)

vitest is the Vite-native testing framework — fast, modern, and compatible with Jest.

Component testing for React components. Snapshot testing for UI regression. Mock testing for API calls. DOM testing for user interactions.

| Concept | What It Means | When to Use |
|---|---|---|
| Component Tests | Test React components in isolation | Every UI component |
| Snapshots | Capture and compare rendered output | UI regression detection |
| Mocks | Simulate API responses | Tests without backend dependency |
| DOM Testing | Simulate user clicks and inputs | User interaction flows |

4. Definition of Done

Code isn't "done" when it compiles. It's "done" when all tests pass, coverage meets target, and acceptance criteria are verified.

The Definition of Done is a checklist: all unit tests pass, integration tests pass, coverage > 80%, no critical defects, acceptance criteria verified, code reviewed, and merged to DEV.

| Criterion | What It Means | Verified By |
|---|---|---|
| All Tests Pass | Zero failures across all test levels | CI/CD pipeline |
| Coverage Target Met | ≥ 80% line coverage | Coverage report |
| No Critical Defects | Zero P0/P1 bugs open | Defect log |
| Acceptance Criteria Met | Every AC verified with a test | Test-to-AC mapping |
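One way to make the test-to-AC mapping auditable is a small CI script that diffs the PRD's criteria against what the suite claims to cover. A minimal sketch, with made-up AC IDs and test names:

```python
# Hypothetical AC IDs from the PRD, and the tests that claim to verify them.
acceptance_criteria = {"AC-1", "AC-2", "AC-3", "AC-4"}

test_claims = {
    "test_valid_email": {"AC-1"},
    "test_strong_password": {"AC-2"},
    "test_duplicate_email": {"AC-3"},
}

# Any criterion with no test claiming it should fail the build.
covered = set().union(*test_claims.values())
unverified = acceptance_criteria - covered
print(sorted(unverified))  # ['AC-4']: nothing verifies the welcome email
```

In a real pipeline the claims would come from test metadata (for example, a custom pytest marker) rather than a hand-maintained dict; the audit logic stays the same.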

5. Defect Reporting & Triage

A bug report without reproduction steps is noise. A bug report with root cause is a gift.

Good defect reports include: summary, steps to reproduce, expected vs. actual behavior, severity, and root cause hypothesis. Triage categorizes by severity and assigns to the right developer.

| Severity | Definition | Response Time |
|---|---|---|
| P0 — Critical | System unusable, data loss risk | Immediate — stop current work |
| P1 — High | Major feature broken, workaround exists | Within sprint |
| P2 — Medium | Minor feature broken, low impact | Next sprint |
| P3 — Low | Cosmetic, no functional impact | Backlog |
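Triage order can be encoded directly: sort the open defect log by severity so P0s always surface first. A sketch with invented defect records (field names are illustrative, not from any specific tracker):

```python
from dataclasses import dataclass


@dataclass
class Defect:
    id: str
    severity: int  # 0 = P0 (critical) ... 3 = P3 (cosmetic)
    summary: str


# Invented defect log entries for illustration.
backlog = [
    Defect("BUG-12", 2, "Tooltip misaligned on narrow screens"),
    Defect("BUG-07", 0, "Registration drops user record on retry"),
    Defect("BUG-31", 1, "Password reset broken; email workaround exists"),
]

# Lower severity number means higher urgency, so an ascending sort
# puts the P0 at the top of the triage queue.
for d in sorted(backlog, key=lambda d: d.severity):
    print(f"P{d.severity} {d.id}: {d.summary}")
```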

The Complete Mapping

| # | Pillar | What It Answers | Key Decision |
|---|---|---|---|
| 1 | Test Strategy | What to test and at what level? | Pyramid: unit > integration > e2e |
| 2 | pytest | How to test the backend? | Fixtures, parametrize, coverage |
| 3 | vitest | How to test the frontend? | Components, snapshots, mocks |
| 4 | Definition of Done | When is code truly "done"? | Tests + coverage + AC verification |
| 5 | Defect Reporting | How to report and triage bugs? | Severity + reproduction + root cause |

That's it. Master these 5 pillars, master QA.


Try It Yourself — A Starter Prompt for Test Plan Generation

This prompt gives you a working starting point. For the complete prompt — with test-to-AC mapping, coverage gap analysis, and defect triage workflows — see the full course chapter →.
You are a Senior QA Engineer with experience in pytest and vitest.

I need a test plan for:

{{PASTE YOUR FEATURE CODE OR REQUIREMENTS}}

Cover these 5 areas:

1. TEST STRATEGY — Map each requirement to a test level (unit, integration, e2e).
2. BACKEND TESTS — Write pytest test cases for every backend function. Include edge cases.
3. FRONTEND TESTS — Write vitest test cases for every UI component. Include user interaction tests.
4. COVERAGE TARGETS — Set coverage targets per module and identify potential gaps.
5. DEFINITION OF DONE — Define the checklist for "done" including test pass rates and coverage thresholds.

For each area, provide: the test cases and a brief justification.

Format as a structured document with tables where appropriate.

What This Prompt Covers vs. What It Misses

| Skill | Lite Prompt (Free) | Full Prompt (Course) | Impact of Missing It |
|---|---|---|---|
| Test strategy mapping | ✅ Covered | ✅ Covered | |
| pytest + vitest tests | ✅ Covered | ✅ Covered | |
| Coverage targets | ✅ Covered | ✅ Covered | |
| Test-to-acceptance-criteria mapping | ❌ Missing | ✅ Every AC linked to specific tests | AC exists but no test verifies it — discovered in UAT |
| Negative test cases | ⚠️ "Edge cases" | ✅ Explicit negative paths: invalid input, auth failure, concurrency | Happy path tested thoroughly — first bad input crashes the system |
| Test data strategy | ❌ Missing | ✅ Fixtures, factories, seed data design | Tests use hardcoded data that breaks when the schema changes |
| Flaky test detection | ❌ Missing | ✅ Retry policies and environment isolation | 5% of tests fail randomly — team ignores all test failures. Trust erodes. |
| Performance test hooks | ❌ Missing | ✅ Response time assertions within functional tests | Feature works correctly but takes 8 seconds — no test caught the latency |

The Lite Prompt gets you to ~60% quality. Good enough to have a test plan. Not good enough to catch the bugs that reach production.


Real-World Example: Test Plan for a User Registration Feature

The Requirement

"Test a user registration feature: email validation, password strength check, duplicate account detection, welcome email trigger, and database record creation."

Lite Prompt Output — High-Level Test Plan

① Test Strategy

Unit tests for email validation and password check. Integration test for registration flow. E2E test for full signup journey.

② Backend Tests (pytest)

test_valid_email, test_invalid_email, test_strong_password, test_weak_password, test_duplicate_email, test_registration_creates_user.

③ Frontend Tests (vitest)

test_form_renders, test_submit_with_valid_data, test_error_on_invalid_email.

④ Coverage Target

80% line coverage for registration module.

⑤ Definition of Done

All tests pass. 80% coverage. No P0 bugs.


What a QA Lead Would Catch

| Area | Lite Output Says | What's Missing | Real-World Consequence |
|---|---|---|---|
| Strategy | "Unit + Integration + E2E" | No test priority. Which tests run first in CI? Which block deployment? | CI takes 20 minutes. Critical unit test failure hidden behind slow E2E suite. Bug discovered late. |
| Backend | "test_duplicate_email" | No concurrency test. What if 2 users register with the same email simultaneously? | Race condition: both registrations succeed. Two accounts with the same email. Data integrity broken. |
| Frontend | "test_error_on_invalid_email" | No test for password strength indicator. No test for form state after error. | User types a weak password, gets no visual feedback. User submits, gets a server error. Bad UX. |
| Coverage | "80% coverage" | Which 20% is uncovered? Is it error handling code? That's the most critical 20%. | Coverage reports "80% ✅" but the uncovered code is the exception handlers. First real error = unhandled crash. |
| DoD | "All tests pass, 80% coverage, no P0" | No AC-to-test mapping. Is "welcome email sent" tested? | Registration works but the welcome email never fires. Nobody noticed because no test checks it. |

The pattern: The Lite Prompt asks "what are the test cases?" The full course asks "what are the test cases, what do they miss, and what bug escapes to production?"


Ready to Write Test Plans That Catch Everything?

Enroll in the Fresh Graduate AI SDLC Course →

Go from "I write tests" to "I write test suites that catch bugs before users do."